Improving reservoirs using intrinsic plasticity
Authors
Abstract
The benefits of Intrinsic Plasticity (IP), an unsupervised, local, biologically inspired adaptation rule that tunes the probability density of a neuron's output towards an exponential distribution, thereby maximizing the information transmitted by the neuron, have already been demonstrated. In this work, we extend this adaptation method to a more commonly used nonlinearity and a Gaussian output distribution. After deriving the learning rules, we show how the bounded output of the transfer function affects the moments of the actual output distribution, which allows us to show that the rule converges to the expected distributions, even in random recurrent networks. The IP rule is evaluated in a Reservoir Computing setting, a temporal processing technique that uses random, untrained recurrent networks as excitable media, whose state is fed to a linear regressor that computes the desired output. We present an experimental comparison of the different IP rules on three benchmark tasks with different characteristics. Furthermore, we show that this unsupervised reservoir adaptation can turn networks with very constrained topologies, such as a 1D lattice, whose dynamics are generally quite unsuitable, into reservoirs that can solve complex tasks. We clearly demonstrate that IP makes Reservoir Computing more robust: the internal dynamics autonomously tune themselves, irrespective of initial weights or input scaling, to the dynamic regime that is optimal for a given task.
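To make the adaptation concrete, the sketch below shows Gaussian-target IP applied to the tanh neurons of a small random reservoir. It is a minimal illustration rather than the authors' implementation: the reservoir size, input scaling, target mean/standard deviation (`mu`, `sigma`) and step size (`eta`) are assumed values, and the update equations follow the commonly cited Gaussian-IP rule for tanh units, so they should be checked against the paper's own derivation.

```python
import numpy as np

# Minimal sketch (assumed parameters, not the authors' code): Gaussian
# intrinsic plasticity applied to the tanh neurons of a random reservoir.
# Each neuron computes y_i = tanh(a_i * x_i + b_i), where x_i is its net
# input; IP adapts gain a_i and bias b_i so that the output distribution
# approaches N(mu, sigma^2).

rng = np.random.default_rng(1)
N = 100                              # reservoir size (illustrative)
mu, sigma, eta = 0.0, 0.2, 0.0005    # target mean/std and IP step size (assumed)

W = rng.normal(0.0, 1.0, (N, N)) / np.sqrt(N)   # random recurrent weights
w_in = rng.uniform(-0.1, 0.1, N)                # random input weights
a, b = np.ones(N), np.zeros(N)                  # per-neuron gain and bias
y = np.zeros(N)                                 # reservoir state

states = []
for t in range(20000):
    u = rng.uniform(-1.0, 1.0)       # white-noise driving input
    x = W @ y + w_in * u             # net input to each neuron
    y = np.tanh(a * x + b)
    # Gaussian-IP gradient step (commonly cited form; verify against the paper)
    db = -eta * (-mu / sigma**2
                 + (y / sigma**2) * (2 * sigma**2 + 1 - y**2 + mu * y))
    da = eta / a + db * x
    a, b = a + da, b + db
    if t >= 15000:                   # keep late states to estimate statistics
        states.append(y.copy())

S = np.asarray(states)
print("mean output:", S.mean(), " mean per-neuron std:", S.std(axis=0).mean())
```

In the Reservoir Computing setting described above, a linear readout (for instance ridge regression on the collected states) would then be trained on the IP-adapted reservoir; the adaptation itself never uses the target signal.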
Similar resources
Online reservoir adaptation by intrinsic plasticity for backpropagation-decorrelation and echo state learning
We propose to use a biologically motivated learning rule based on neural intrinsic plasticity to optimize reservoirs of analog neurons. This rule is based on an information maximization principle; it is local in time and space and thus computationally efficient. We show experimentally that it can drive the neurons' output activities to approximate exponential distributions. Thereby it implement...
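The exponential-target rule summarized in this snippet can be sketched for a single fermi (logistic) neuron as follows. The target mean `mu` and step size `eta` are illustrative, and the update follows the commonly cited Triesch-style IP rule rather than any specific implementation from the cited paper.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def ip_exponential_step(x, a, b, mu=0.2, eta=0.001):
    """One IP step for a fermi neuron y = sigmoid(a*x + b), pushing the
    output distribution toward an exponential with mean mu
    (Triesch-style rule; parameter values are illustrative)."""
    y = sigmoid(a * x + b)
    db = eta * (1.0 - (2.0 + 1.0 / mu) * y + y**2 / mu)
    da = eta / a + db * x
    return a + da, b + db

# Drive a single neuron with random input and check that the output mean
# approaches the target mu.
rng = np.random.default_rng(0)
a, b = 1.0, 0.0
xs = rng.normal(0.0, 1.0, 50000)
for x in xs:
    a, b = ip_exponential_step(x, a, b)
print(sigmoid(a * xs + b).mean())   # should end up close to mu = 0.2
```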
Intrinsic plasticity for reservoir learning algorithms
One of the most difficult problems in using dynamic reservoirs like echo state networks for signal processing is the choice of reservoir network parameters like connectivity or spectral radius of the weight matrix. In this article, we investigate the properties of an unsupervised intrinsic plasticity rule for signal specific adaptive shaping of the reservoir, which is local in space and time an...
Information Theoretic Self-organised Adaptation in Reservoirs for Temporal Memory Tasks
Recurrent neural networks of the Reservoir Computing (RC) type have been found useful in various time-series processing tasks with inherent non-linearity and requirements of temporal memory. Here with the aim to obtain extended temporal memory in generic delayed response tasks, we combine a generalised intrinsic plasticity mechanism with an information storage based neuron leak adaptation rule ...
Morphological divergence and flow-induced phenotypic plasticity in a native fish from anthropogenically altered stream habitats
Understanding population-level responses to human-induced changes to habitats can elucidate the evolutionary consequences of rapid habitat alteration. Reservoirs constructed on streams expose stream fishes to novel selective pressures in these habitats. Assessing the drivers of trait divergence facilitated by these habitats will help identify evolutionary and ecological consequences of reservoi...
Intrinsic Plasticity via Natural Gradient Descent
This paper introduces the natural gradient for intrinsic plasticity, which tunes a neuron’s activation function such that its output distribution becomes exponentially distributed. The information-geometric properties of the intrinsic plasticity potential are analyzed and the improved learning dynamics when using the natural gradient are evaluated for a variety of input distributions. The appli...
Journal: Neurocomputing
Volume: 71
Pages: -
Publication date: 2008